Dario Airoldi
November 3, 2025

Azure Blob Storage is Microsoft’s object storage solution for the cloud, providing massively scalable and secure storage for unstructured data such as documents, media files, backups, and application data.
This guide covers the available approaches and libraries for accessing Azure Blob Storage using C#.

Primary Library: Azure.Storage.Blobs

⚠️ Important: Legacy SDKs should not be used for new projects, and existing applications should migrate to Azure.Storage.Blobs.

<PackageReference Include="Azure.Storage.Blobs" Version="12.21.2" />
<!-- For dependency injection -->
<PackageReference Include="Microsoft.Extensions.DependencyInjection" Version="8.0.0" />
<PackageReference Include="Microsoft.Extensions.Configuration" Version="8.0.0" />
<!-- For managed identity authentication -->
<PackageReference Include="Azure.Identity" Version="1.12.0" />
<!-- For advanced data movement operations -->
<PackageReference Include="Azure.Storage.DataMovement.Blobs" Version="12.0.0-beta.6" />

The Azure.Storage.Blobs SDK provides three main client types for different levels of operations:
| Client Type | BlobServiceClient | BlobContainerClient | BlobClient |
|---|---|---|---|
| Scope | Entire storage account | Single container | Single blob |
| Operations | Account-level management | Container and blob operations | Blob-specific operations |
| Use Cases | List containers, account properties | Upload, download, list blobs | Read, write, delete specific blob |
| Authentication | Account-level permissions | Container-level permissions | Blob-level permissions |
BlobServiceClient manages storage account-level operations and serves as the entry point for accessing containers and blobs.
using Azure.Storage.Blobs;
using Azure.Identity;
// Using Managed Identity (recommended)
var credential = new DefaultAzureCredential();
var serviceClient = new BlobServiceClient(
new Uri("https://yourstorageaccount.blob.core.windows.net/"),
credential);
// Account-level operations
await foreach (var container in serviceClient.GetBlobContainersAsync())
{
Console.WriteLine($"Container: {container.Name}");
}
// Get account information (GetPropertiesAsync returns service configuration,
// not account details — AccountKind comes from GetAccountInfoAsync)
var accountInfo = await serviceClient.GetAccountInfoAsync();
Console.WriteLine($"Account kind: {accountInfo.Value.AccountKind}");

BlobContainerClient handles container-specific operations and blob management within a container.
// Get container client from service client
var containerClient = serviceClient.GetBlobContainerClient("mycontainer");
// Or create directly
var containerClient2 = new BlobContainerClient(
new Uri("https://yourstorageaccount.blob.core.windows.net/mycontainer"),
credential);
// Ensure container exists
await containerClient.CreateIfNotExistsAsync();
// Container operations
var properties = await containerClient.GetPropertiesAsync();
Console.WriteLine($"Container created: {properties.Value.CreatedOn}");
// List blobs in container
await foreach (var blobItem in containerClient.GetBlobsAsync())
{
Console.WriteLine($"Blob: {blobItem.Name} ({blobItem.Properties.ContentLength} bytes)");
}

BlobClient handles individual blob operations like upload, download, and metadata management.
// Get blob client from container client
var blobClient = containerClient.GetBlobClient("myfile.txt");
// Or create directly
var blobClient2 = new BlobClient(
new Uri("https://yourstorageaccount.blob.core.windows.net/mycontainer/myfile.txt"),
credential);
// Check if blob exists
bool exists = await blobClient.ExistsAsync();
Console.WriteLine($"Blob exists: {exists}");
// Get blob properties
if (exists)
{
var properties = await blobClient.GetPropertiesAsync();
Console.WriteLine($"Blob size: {properties.Value.ContentLength} bytes");
Console.WriteLine($"Content type: {properties.Value.ContentType}");
Console.WriteLine($"Last modified: {properties.Value.LastModified}");
}

The Azure.Storage.Blobs SDK abstracts the complexity of direct HTTP REST API calls by:
REST API Foundation: All operations translate to HTTP requests to Azure Blob Storage REST endpoints:
- GET requests for download and list operations
- PUT requests for upload and create operations
- POST requests for batch operations
- DELETE requests for delete operations

Authentication Handling: Automatically manages authentication headers (Azure AD tokens, SAS tokens, or account keys) for each REST call
Serialization/Deserialization: Handles conversion between .NET objects and JSON/XML used by the REST API
Error Translation: Transforms HTTP status codes into meaningful .NET exceptions with proper error messages
Connection Management: Implements HTTP connection pooling, timeouts, and retry logic
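The error-translation layer is visible in everyday code: a failed REST call surfaces as a typed RequestFailedException carrying the HTTP status and the storage error code, rather than a raw HTTP response. A minimal sketch (the account, container, and blob names are placeholders):

```csharp
using Azure;
using Azure.Identity;
using Azure.Storage.Blobs;

// Placeholder URL — substitute your own account/container/blob
var blobClient = new BlobClient(
    new Uri("https://yourstorageaccount.blob.core.windows.net/mycontainer/missing.txt"),
    new DefaultAzureCredential());

try
{
    var response = await blobClient.DownloadContentAsync();
}
catch (RequestFailedException ex) when (ex.Status == 404)
{
    // ErrorCode mirrors the x-ms-error-code header from the REST response,
    // e.g. "BlobNotFound"
    Console.WriteLine($"Status: {ex.Status}, ErrorCode: {ex.ErrorCode}");
}
```

Filtering with `when (ex.Status == ...)` keeps the catch block specific to the condition you expect instead of swallowing every storage error.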
Azure Blob Storage supports multiple upload approaches for different scenarios:
// Upload a local file
string localFilePath = @"C:\temp\document.pdf";
string blobName = "documents/document.pdf";
var blobClient = containerClient.GetBlobClient(blobName);
// Simple upload with overwrite
await blobClient.UploadAsync(localFilePath, overwrite: true);
// Upload with options
var uploadOptions = new BlobUploadOptions
{
HttpHeaders = new BlobHttpHeaders
{
ContentType = "application/pdf",
ContentLanguage = "en-US"
},
Metadata = new Dictionary<string, string>
{
["UploadedBy"] = "MyApplication",
["UploadDate"] = DateTime.UtcNow.ToString("O")
},
Conditions = new BlobRequestConditions
{
IfNoneMatch = ETag.All // Only upload if blob doesn't exist
}
};
await blobClient.UploadAsync(localFilePath, uploadOptions);

// Upload from memory stream
string content = "Hello, Azure Blob Storage!";
using var stream = new MemoryStream(Encoding.UTF8.GetBytes(content));
var blobClient = containerClient.GetBlobClient("messages/greeting.txt");
await blobClient.UploadAsync(stream, new BlobHttpHeaders
{
ContentType = "text/plain; charset=utf-8"
});
// Upload large stream with progress tracking
using var fileStream = File.OpenRead(@"C:\temp\largefile.zip");
var progress = new Progress<long>(bytesUploaded =>
{
Console.WriteLine($"Uploaded {bytesUploaded:N0} bytes");
});
await blobClient.UploadAsync(fileStream, new BlobUploadOptions
{
ProgressHandler = progress,
TransferOptions = new StorageTransferOptions
{
MaximumConcurrency = 4,
InitialTransferSize = 1024 * 1024, // 1 MB
MaximumTransferSize = 4 * 1024 * 1024 // 4 MB
}
});

// Upload directly to Cool tier for cost optimization
await blobClient.UploadAsync(localFilePath, new BlobUploadOptions
{
AccessTier = AccessTier.Cool,
HttpHeaders = new BlobHttpHeaders { ContentType = "application/pdf" }
});
// Upload to Archive tier (lowest cost, highest access time)
await blobClient.UploadAsync(localFilePath, new BlobUploadOptions
{
AccessTier = AccessTier.Archive
});

string blobName = "documents/report.pdf";
string localPath = @"C:\downloads\report.pdf";
var blobClient = containerClient.GetBlobClient(blobName);
// Check if blob exists before downloading
if (await blobClient.ExistsAsync())
{
// Simple download
await blobClient.DownloadToAsync(localPath);
// Download with progress tracking
var progress = new Progress<long>(bytesDownloaded =>
{
Console.WriteLine($"Downloaded {bytesDownloaded:N0} bytes");
});
await blobClient.DownloadToAsync(localPath, new BlobDownloadToOptions
{
ProgressHandler = progress,
TransferOptions = new StorageTransferOptions
{
MaximumConcurrency = 4
}
});
}

var blobClient = containerClient.GetBlobClient("data/config.json");
// Download to memory stream
using var stream = new MemoryStream();
await blobClient.DownloadToAsync(stream);
// Read content as string
stream.Position = 0;
using var reader = new StreamReader(stream);
string content = await reader.ReadToEndAsync();
Console.WriteLine(content);

var blobClient = containerClient.GetBlobClient("messages/note.txt");
// Download as BinaryData
BinaryData data = await blobClient.DownloadContentAsync();
string content = data.ToString();
// Download with response (includes metadata and properties)
var response = await blobClient.DownloadContentAsync();
string text = response.Value.Content.ToString();
var metadata = response.Value.Details.Metadata;
var contentType = response.Value.Details.ContentType;
Console.WriteLine($"Content: {text}");
Console.WriteLine($"Content Type: {contentType}");
Console.WriteLine($"Metadata count: {metadata.Count}");

var blobClient = containerClient.GetBlobClient("documents/cached-report.pdf");
try
{
// Only download if modified since last check
var lastCheckTime = DateTime.UtcNow.AddHours(-1);
var response = await blobClient.DownloadContentAsync(new BlobDownloadOptions
{
Conditions = new BlobRequestConditions
{
IfModifiedSince = lastCheckTime
}
});
Console.WriteLine("Blob was modified, downloaded new version");
}
catch (RequestFailedException ex) when (ex.Status == 304)
{
Console.WriteLine("Blob not modified since last check");
}

var blobClient = containerClient.GetBlobClient("large-files/dataset.csv");
// Download first 1KB only
var range = new HttpRange(0, 1024);
var response = await blobClient.DownloadStreamingAsync(new BlobDownloadOptions
{
Range = range
});
using var stream = response.Value.Content;
// Process first 1KB of the file

var blobClient = containerClient.GetBlobClient("config/settings.json");
// Simple overwrite
string newContent = JsonSerializer.Serialize(new { setting1 = "value1" });
await blobClient.UploadAsync(BinaryData.FromString(newContent), overwrite: true);
// Conditional overwrite (only if blob exists and has specific ETag)
try
{
var properties = await blobClient.GetPropertiesAsync();
await blobClient.UploadAsync(
BinaryData.FromString(newContent),
new BlobUploadOptions
{
Conditions = new BlobRequestConditions
{
IfMatch = properties.Value.ETag // Only update if ETag matches
}
});
}
catch (RequestFailedException ex) when (ex.Status == 412)
{
Console.WriteLine("Blob was modified by another process");
}

var blobClient = containerClient.GetBlobClient("images/photo.jpg");
// Update metadata without changing blob content
var metadata = new Dictionary<string, string>
{
["ProcessedBy"] = Environment.MachineName,
["ProcessedAt"] = DateTime.UtcNow.ToString("O"),
["Version"] = "2.0"
};
await blobClient.SetMetadataAsync(metadata);

var blobClient = containerClient.GetBlobClient("documents/manual.pdf");
// Update HTTP headers
var headers = new BlobHttpHeaders
{
ContentType = "application/pdf",
ContentLanguage = "en-US",
ContentDisposition = "attachment; filename=user-manual.pdf",
CacheControl = "public, max-age=31536000" // Cache for 1 year
};
await blobClient.SetHttpHeadersAsync(headers);

var blobClient = containerClient.GetBlobClient("temp/temp-file.txt");
// Delete blob if it exists
var deleteResponse = await blobClient.DeleteIfExistsAsync();
if (deleteResponse.Value)
{
Console.WriteLine("Blob deleted successfully");
}
else
{
Console.WriteLine("Blob does not exist");
}
// Delete with conditions
await blobClient.DeleteAsync(DeleteSnapshotsOption.IncludeSnapshots,
conditions: new BlobRequestConditions
{
IfUnmodifiedSince = DateTime.UtcNow.AddDays(-1)
});

var blobClient = containerClient.GetBlobClient("important/document.docx");
// Delete only if no snapshots exist
try
{
await blobClient.DeleteAsync();
}
catch (RequestFailedException ex) when (ex.ErrorCode == "SnapshotsPresent")
{
// Delete blob and all its snapshots
await blobClient.DeleteAsync(DeleteSnapshotsOption.IncludeSnapshots);
}

// List deleted blobs (requires soft delete enabled on storage account)
// BlobTraits and BlobStates are flags enums, not classes
var traits = BlobTraits.Metadata;
var states = BlobStates.Deleted;
await foreach (var deletedBlob in containerClient.GetBlobsAsync(
traits: traits, states: states))
{
Console.WriteLine($"Deleted blob: {deletedBlob.Name}, " +
$"Deleted on: {deletedBlob.Properties.DeletedOn}");
// Undelete if needed
var blobClient = containerClient.GetBlobClient(deletedBlob.Name);
await blobClient.UndeleteAsync();
Console.WriteLine($"Restored blob: {deletedBlob.Name}");
}

Azure Blob Storage supports efficient batch operations for processing multiple blobs:
// Batch delete multiple blobs
var blobsToDelete = new List<string>
{
"temp/file1.txt",
"temp/file2.txt",
"temp/file3.txt"
};
var batchClient = serviceClient.GetBlobBatchClient();
var batch = batchClient.CreateBatch();
foreach (var blobName in blobsToDelete)
{
var blobClient = containerClient.GetBlobClient(blobName);
batch.DeleteBlob(blobClient.Uri);
}
// Execute batch operation
try
{
await batchClient.SubmitBatchAsync(batch, throwOnAnyFailure: true);
Console.WriteLine($"Successfully deleted {blobsToDelete.Count} blobs");
}
catch (AggregateException ex)
{
// Handle individual failures
foreach (var innerEx in ex.InnerExceptions)
{
Console.WriteLine($"Failed to delete blob: {innerEx.Message}");
}
}

// Move multiple blobs to Cool tier
var blobNames = await GetOldBlobNames(containerClient); // Custom method
var batchClient = serviceClient.GetBlobBatchClient();
var batch = batchClient.CreateBatch();
foreach (var blobName in blobNames)
{
var blobUri = new Uri($"{containerClient.Uri}/{blobName}");
batch.SetBlobAccessTier(blobUri, AccessTier.Cool);
}
await batchClient.SubmitBatchAsync(batch);
Console.WriteLine($"Moved {blobNames.Count} blobs to Cool tier");

async Task UploadLargeFileAsync(string filePath, BlobClient blobClient)
{
using var fileStream = File.OpenRead(filePath);
var uploadOptions = new BlobUploadOptions
{
TransferOptions = new StorageTransferOptions
{
// Upload in parallel chunks
MaximumConcurrency = Environment.ProcessorCount,
// 8MB chunks for better throughput
MaximumTransferSize = 8 * 1024 * 1024,
// Start with 1MB initial transfer
InitialTransferSize = 1024 * 1024
},
ProgressHandler = new Progress<long>(bytesUploaded =>
{
var percentage = (double)bytesUploaded / fileStream.Length * 100;
Console.WriteLine($"Uploaded {percentage:F1}% ({bytesUploaded:N0}/{fileStream.Length:N0} bytes)");
})
};
await blobClient.UploadAsync(fileStream, uploadOptions);
}

async Task ProcessLargeFileStreamingAsync(BlobClient blobClient)
{
var downloadInfo = await blobClient.DownloadStreamingAsync();
using var stream = downloadInfo.Value.Content;
// Process file in chunks without loading entire file into memory
var buffer = new byte[1024 * 1024]; // 1MB buffer
long totalBytesRead = 0;
while (true)
{
int bytesRead = await stream.ReadAsync(buffer);
if (bytesRead == 0) break;
totalBytesRead += bytesRead;
// Process chunk (e.g., hash calculation, parsing, etc.)
ProcessChunk(buffer.AsSpan(0, bytesRead));
Console.WriteLine($"Processed {totalBytesRead:N0} bytes");
}
}
void ProcessChunk(ReadOnlySpan<byte> chunk)
{
// Custom processing logic
// Example: calculate hash, parse data, etc.
}

async Task UpdateBlobWithOptimisticConcurrency(BlobClient blobClient, string newContent)
{
int maxRetries = 5;
int retryCount = 0;
while (retryCount < maxRetries)
{
try
{
// Get current ETag
var properties = await blobClient.GetPropertiesAsync();
var currentETag = properties.Value.ETag;
// Update only if ETag hasn't changed
await blobClient.UploadAsync(
BinaryData.FromString(newContent),
new BlobUploadOptions
{
Conditions = new BlobRequestConditions
{
IfMatch = currentETag
}
});
Console.WriteLine("Blob updated successfully");
return;
}
catch (RequestFailedException ex) when (ex.Status == 412) // Precondition Failed
{
retryCount++;
Console.WriteLine($"Blob was modified by another process. Retry {retryCount}/{maxRetries}");
if (retryCount >= maxRetries)
{
throw new InvalidOperationException("Failed to update blob after maximum retries due to concurrent modifications");
}
// Wait before retry with exponential backoff
await Task.Delay(TimeSpan.FromMilliseconds(100 * Math.Pow(2, retryCount)));
}
}
}

// Only process if blob was modified in the last hour
var blobClient = containerClient.GetBlobClient("data/sensor-readings.json");
var oneHourAgo = DateTimeOffset.UtcNow.AddHours(-1);
try
{
var response = await blobClient.DownloadContentAsync(new BlobDownloadOptions
{
Conditions = new BlobRequestConditions
{
IfModifiedSince = oneHourAgo
}
});
// Process new data
await ProcessSensorData(response.Value.Content.ToString());
}
catch (RequestFailedException ex) when (ex.Status == 304) // Not Modified
{
Console.WriteLine("No new sensor data available");
}

// In Program.cs or Startup.cs
using Azure.Identity;
using Azure.Storage.Blobs;
using Microsoft.Extensions.DependencyInjection;
var builder = WebApplication.CreateBuilder(args);
// Register BlobServiceClient with DI
builder.Services.AddSingleton<BlobServiceClient>(provider =>
{
var configuration = provider.GetRequiredService<IConfiguration>();
var storageAccountName = configuration["Azure:StorageAccount:Name"];
var accountUri = new Uri($"https://{storageAccountName}.blob.core.windows.net");
return new BlobServiceClient(accountUri, new DefaultAzureCredential());
});
// Register typed clients for specific containers
builder.Services.AddSingleton<IDocumentStorageService, DocumentStorageService>();
var app = builder.Build();

// Configuration class
public class BlobStorageOptions
{
public string ConnectionString { get; set; }
public string AccountName { get; set; }
public string DefaultContainer { get; set; }
public bool UseManagedIdentity { get; set; }
}
// Service registration
builder.Services.Configure<BlobStorageOptions>(
builder.Configuration.GetSection("BlobStorage"));
builder.Services.AddSingleton<BlobServiceClient>(provider =>
{
var options = provider.GetRequiredService<IOptions<BlobStorageOptions>>().Value;
if (options.UseManagedIdentity)
{
var accountUri = new Uri($"https://{options.AccountName}.blob.core.windows.net");
return new BlobServiceClient(accountUri, new DefaultAzureCredential());
}
else
{
return new BlobServiceClient(options.ConnectionString);
}
});
// Service implementation
public interface IDocumentStorageService
{
Task<string> UploadDocumentAsync(Stream document, string fileName);
Task<Stream> DownloadDocumentAsync(string fileName);
}
public class DocumentStorageService : IDocumentStorageService
{
private readonly BlobContainerClient _containerClient;
private readonly ILogger<DocumentStorageService> _logger;
public DocumentStorageService(BlobServiceClient blobServiceClient,
IOptions<BlobStorageOptions> options,
ILogger<DocumentStorageService> logger)
{
_containerClient = blobServiceClient.GetBlobContainerClient(options.Value.DefaultContainer);
_logger = logger;
}
public async Task<string> UploadDocumentAsync(Stream document, string fileName)
{
try
{
var blobClient = _containerClient.GetBlobClient(fileName);
await blobClient.UploadAsync(document, overwrite: true);
_logger.LogInformation("Document {FileName} uploaded successfully", fileName);
return blobClient.Uri.ToString();
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to upload document {FileName}", fileName);
throw;
}
}
public async Task<Stream> DownloadDocumentAsync(string fileName)
{
try
{
var blobClient = _containerClient.GetBlobClient(fileName);
var response = await blobClient.DownloadStreamingAsync();
_logger.LogInformation("Document {FileName} downloaded successfully", fileName);
return response.Value.Content;
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to download document {FileName}", fileName);
throw;
}
}
}

Managed Identity is the most secure approach for Azure-hosted applications, eliminating the need for stored credentials.
using Azure.Identity;
using Azure.Storage.Blobs;
// Simple approach - uses system-assigned managed identity
var credential = new DefaultAzureCredential();
var blobServiceClient = new BlobServiceClient(
new Uri("https://mystorageaccount.blob.core.windows.net"),
credential);
// Verify authentication
try
{
// AccountKind is exposed by GetAccountInfoAsync, not GetPropertiesAsync
var accountInfo = await blobServiceClient.GetAccountInfoAsync();
Console.WriteLine($"Successfully authenticated to {accountInfo.Value.AccountKind} storage account");
}
catch (AuthenticationFailedException ex)
{
Console.WriteLine($"Authentication failed: {ex.Message}");
}

// Specify user-assigned managed identity
var credential = new DefaultAzureCredential(new DefaultAzureCredentialOptions
{
ManagedIdentityClientId = "your-user-assigned-identity-client-id"
});
var blobServiceClient = new BlobServiceClient(
new Uri("https://mystorageaccount.blob.core.windows.net"),
credential);

// For local development - uses Azure CLI credentials
var credential = new DefaultAzureCredential(new DefaultAzureCredentialOptions
{
ExcludeEnvironmentCredential = true,
ExcludeManagedIdentityCredential = true,
ExcludeSharedTokenCacheCredential = true,
ExcludeVisualStudioCredential = true,
ExcludeVisualStudioCodeCredential = true,
ExcludeAzureCliCredential = false, // Use Azure CLI
ExcludeInteractiveBrowserCredential = true
});
var blobServiceClient = new BlobServiceClient(
new Uri("https://mystorageaccount.blob.core.windows.net"),
credential);

Connection strings are simple but less secure than managed identity. Use only for development or when managed identity isn’t available.
// From configuration
var connectionString = configuration.GetConnectionString("AzureStorage");
var blobServiceClient = new BlobServiceClient(connectionString);
// Directly specified (not recommended for production)
var inlineConnectionString = "DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=base64key;EndpointSuffix=core.windows.net";
var blobServiceClient2 = new BlobServiceClient(inlineConnectionString);
// Get specific container
var containerClient = blobServiceClient.GetBlobContainerClient("mycontainer");

SAS tokens provide fine-grained, time-limited access to storage resources.
// Create BlobServiceClient with SAS token
var sasToken = "?sv=2021-06-08&ss=b&srt=sco&sp=rwdlacupx&se=2024-12-31T23:59:59Z&st=2024-01-01T00:00:00Z&spr=https&sig=signature";
var accountUri = new Uri("https://mystorageaccount.blob.core.windows.net");
var sasUri = new Uri($"{accountUri}{sasToken}");
var blobServiceClient = new BlobServiceClient(sasUri);

// Generate container-level SAS token
var containerClient = new BlobContainerClient(connectionString, "mycontainer");
var sasBuilder = new BlobSasBuilder
{
BlobContainerName = containerClient.Name,
Resource = "c", // Container
ExpiresOn = DateTimeOffset.UtcNow.AddHours(2),
Protocol = SasProtocol.Https
};
// Set permissions
sasBuilder.SetPermissions(BlobContainerSasPermissions.Read |
BlobContainerSasPermissions.Write |
BlobContainerSasPermissions.List);
// Generate the SAS token
string sasToken = sasBuilder.ToSasQueryParameters(
new StorageSharedKeyCredential(accountName, accountKey)).ToString();
var sasUri = new Uri($"{containerClient.Uri}?{sasToken}");

// Generate blob-specific SAS token
var blobClient = containerClient.GetBlobClient("document.pdf");
var sasBuilder = new BlobSasBuilder
{
BlobContainerName = blobClient.BlobContainerName,
BlobName = blobClient.Name,
Resource = "b", // Blob
ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(30)
};
sasBuilder.SetPermissions(BlobSasPermissions.Read);
string sasToken = sasBuilder.ToSasQueryParameters(
new StorageSharedKeyCredential(accountName, accountKey)).ToString();
// Share this URL with clients for temporary access
string publicUrl = $"{blobClient.Uri}?{sasToken}";

Account keys provide full access and should be avoided in production. Use only when other methods aren’t available.
// Using StorageSharedKeyCredential
var accountName = "mystorageaccount";
var accountKey = "base64-encoded-key";
var credential = new StorageSharedKeyCredential(accountName, accountKey);
var blobServiceClient = new BlobServiceClient(
new Uri($"https://{accountName}.blob.core.windows.net"),
credential);

⚠️ Security Warning: Account keys provide full access to the storage account. If compromised, attackers have complete control over your data. Always prefer managed identity or SAS tokens.
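If an application already holds a StorageSharedKeyCredential, one way to avoid handing the key itself to clients is to mint short-lived SAS URIs from it with the client's built-in GenerateSasUri helper. A sketch (the account name, key, and blob URL are placeholders):

```csharp
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

// A client constructed with a shared key credential can mint SAS URIs itself
var blobClient = new BlobClient(
    new Uri("https://mystorageaccount.blob.core.windows.net/mycontainer/document.pdf"),
    new StorageSharedKeyCredential("mystorageaccount", "base64-encoded-key"));

if (blobClient.CanGenerateSasUri)
{
    // Hand out read-only access for 30 minutes instead of the account key
    Uri sasUri = blobClient.GenerateSasUri(
        BlobSasPermissions.Read,
        DateTimeOffset.UtcNow.AddMinutes(30));
}
```

CanGenerateSasUri is false when the client was built with a token credential or a SAS, so checking it first avoids a runtime exception.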
| Environment | Recommended Approach | Alternative | Avoid |
|---|---|---|---|
| Azure App Service | System-Assigned Managed Identity | User-Assigned Managed Identity | Account Key |
| Azure Functions | System-Assigned Managed Identity | Connection String (dev only) | Account Key |
| Azure Container Instances | User-Assigned Managed Identity | SAS Token | Account Key |
| Azure Kubernetes Service | Workload Identity | Service Principal | Account Key |
| Local Development | Azure CLI + DefaultAzureCredential | Connection String | Account Key |
| CI/CD Pipelines | Service Principal | Federated Credentials | Account Key |
| External Applications | SAS Token | Service Principal | Account Key |
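The table above can be reflected in startup code by choosing a credential per environment rather than shipping one chain everywhere. A hedged sketch — the environment check and credential choices are illustrative, not prescriptive:

```csharp
using Azure.Core;
using Azure.Identity;
using Azure.Storage.Blobs;

// Illustrative only: select the credential the current environment warrants
bool isDevelopment =
    Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") == "Development";

TokenCredential credential = isDevelopment
    ? new AzureCliCredential()         // local: reuse your `az login` session
    : new ManagedIdentityCredential(); // Azure-hosted: no stored secrets

var blobServiceClient = new BlobServiceClient(
    new Uri("https://mystorageaccount.blob.core.windows.net"),
    credential);
```

Pinning a specific credential type also avoids the probing overhead of DefaultAzureCredential's full chain in production.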
// Assign minimal required RBAC roles:
// - Storage Blob Data Reader: Read-only access
// - Storage Blob Data Contributor: Read/write access
// - Storage Blob Data Owner: Full access including permissions
// Use SAS tokens for limited-time access
var sasBuilder = new BlobSasBuilder
{
Resource = "b", // Blob-level access only
ExpiresOn = DateTimeOffset.UtcNow.AddHours(1), // Short expiration
Protocol = SasProtocol.Https // HTTPS only
};
sasBuilder.SetPermissions(BlobSasPermissions.Read); // Read-only

// ClientOptions.Retry has no public constructor; configure it via its properties
var clientOptions = new BlobClientOptions();
clientOptions.Retry.MaxRetries = 3;
clientOptions.Retry.Delay = TimeSpan.FromMilliseconds(100);
clientOptions.Retry.MaxDelay = TimeSpan.FromSeconds(10);
clientOptions.Retry.Mode = RetryMode.Exponential;
var blobServiceClient = new BlobServiceClient(accountUri, credential, clientOptions);

// For large file uploads
var uploadOptions = new BlobUploadOptions
{
TransferOptions = new StorageTransferOptions
{
// Use multiple threads for parallel upload
MaximumConcurrency = Environment.ProcessorCount,
// Optimize chunk sizes based on content
MaximumTransferSize = 8 * 1024 * 1024, // 8 MB for large files
InitialTransferSize = 1024 * 1024 // 1 MB initial
}
};
// For small files, use smaller chunks
var smallFileOptions = new BlobUploadOptions
{
TransferOptions = new StorageTransferOptions
{
MaximumConcurrency = 1,
MaximumTransferSize = 256 * 1024 // 256 KB
}
};

// Configure HttpClient for connection pooling
var httpClient = new HttpClient(new SocketsHttpHandler
{
PooledConnectionLifetime = TimeSpan.FromMinutes(15),
MaxConnectionsPerServer = 100
});
var clientOptions = new BlobClientOptions
{
Transport = new HttpClientTransport(httpClient)
};
var blobServiceClient = new BlobServiceClient(accountUri, credential, clientOptions);

// ✅ Use prefix filtering for better performance
await foreach (var blob in containerClient.GetBlobsAsync(prefix: "logs/2024/"))
{
// Process only relevant blobs
}
// ✅ Limit results for pagination
var traits = BlobTraits.None; // BlobTraits is a flags enum; skip metadata if not needed
await foreach (var page in containerClient.GetBlobsAsync(traits: traits).AsPages(pageSizeHint: 100))
{
foreach (var blob in page.Values)
{
// Process in batches
}
}

// Requires the Polly NuGet package (using Polly;)
public class BlobStorageService
{
private readonly BlobServiceClient _blobServiceClient;
private readonly IAsyncPolicy _circuitBreaker;
public BlobStorageService(BlobServiceClient blobServiceClient)
{
_blobServiceClient = blobServiceClient;
_circuitBreaker = Policy
.Handle<RequestFailedException>(ex => ex.Status >= 500)
.CircuitBreakerAsync(
handledEventsAllowedBeforeBreaking: 5,
durationOfBreak: TimeSpan.FromMinutes(1),
onBreak: (exception, duration) =>
{
// Log circuit breaker opened
},
onReset: () =>
{
// Log circuit breaker closed
});
}
public async Task<BinaryData> DownloadWithCircuitBreakerAsync(string containerName, string blobName)
{
return await _circuitBreaker.ExecuteAsync(async () =>
{
var blobClient = _blobServiceClient
.GetBlobContainerClient(containerName)
.GetBlobClient(blobName);
var response = await blobClient.DownloadContentAsync();
return response.Value.Content;
});
}
}

public async Task<bool> UploadWithResilience(Stream content, string containerName, string blobName)
{
var retryPolicy = Policy
.Handle<RequestFailedException>(ex =>
ex.Status == 500 || ex.Status == 502 || ex.Status == 503 || ex.Status == 504)
.Or<HttpRequestException>()
.Or<TaskCanceledException>()
.WaitAndRetryAsync(
retryCount: 3,
sleepDurationProvider: retryAttempt =>
TimeSpan.FromSeconds(Math.Pow(2, retryAttempt)) // Exponential backoff
);
try
{
await retryPolicy.ExecuteAsync(async () =>
{
content.Position = 0; // Reset stream position for retries
var blobClient = _blobServiceClient
.GetBlobContainerClient(containerName)
.GetBlobClient(blobName);
await blobClient.UploadAsync(content, overwrite: true);
});
return true;
}
catch (RequestFailedException ex)
{
// Log final failure
return false;
}
}

// In startup/program
services.AddHealthChecks()
.AddCheck<BlobStorageHealthCheck>("blob_storage");
public class BlobStorageHealthCheck : IHealthCheck
{
private readonly BlobServiceClient _blobServiceClient;
public BlobStorageHealthCheck(BlobServiceClient blobServiceClient)
{
_blobServiceClient = blobServiceClient;
}
public async Task<HealthCheckResult> CheckHealthAsync(
HealthCheckContext context,
CancellationToken cancellationToken = default)
{
try
{
// Simple connectivity check
await _blobServiceClient.GetPropertiesAsync(cancellationToken);
return HealthCheckResult.Healthy("Blob Storage is accessible");
}
catch (Exception ex)
{
return HealthCheckResult.Unhealthy("Blob Storage is not accessible", ex);
}
}
}

// Upload directly to appropriate tier
await blobClient.UploadAsync(content, new BlobUploadOptions
{
AccessTier = AccessTier.Cool // For infrequently accessed data
});
// Implement lifecycle management
public async Task ArchiveOldBlobs(BlobContainerClient containerClient)
{
var cutoffDate = DateTimeOffset.UtcNow.AddMonths(-6);
await foreach (var blob in containerClient.GetBlobsAsync())
{
if (blob.Properties.LastModified < cutoffDate)
{
var blobClient = containerClient.GetBlobClient(blob.Name);
await blobClient.SetAccessTierAsync(AccessTier.Archive);
}
}
}

// Use batch operations for efficiency
var batchClient = _blobServiceClient.GetBlobBatchClient();
var batch = batchClient.CreateBatch();
foreach (var blobName in blobsToArchive)
{
var blobUri = new Uri($"{containerClient.Uri}/{blobName}");
batch.SetBlobAccessTier(blobUri, AccessTier.Archive);
}
await batchClient.SubmitBatchAsync(batch); // Single API call for multiple operations

// Implement cost monitoring
public class BlobUsageMonitor
{
public async Task<BlobContainerUsage> GetContainerUsageAsync(BlobContainerClient containerClient)
{
long totalSize = 0;
int blobCount = 0;
var tierCounts = new Dictionary<string, int>();
await foreach (var blob in containerClient.GetBlobsAsync(BlobTraits.All))
{
totalSize += blob.Properties.ContentLength ?? 0;
blobCount++;
var tier = blob.Properties.AccessTier?.ToString() ?? "Unknown";
tierCounts[tier] = tierCounts.GetValueOrDefault(tier) + 1;
}
return new BlobContainerUsage
{
TotalSizeBytes = totalSize,
BlobCount = blobCount,
TierDistribution = tierCounts
};
}
}
public class BlobContainerUsage
{
public long TotalSizeBytes { get; set; }
public int BlobCount { get; set; }
public Dictionary<string, int> TierDistribution { get; set; }
}

The WindowsAzure.Storage package was the original Azure Storage SDK and has been discontinued. Here’s how to migrate:
// Old SDK - DO NOT USE
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
var storageAccount = CloudStorageAccount.Parse(connectionString);
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("mycontainer");
var blockBlob = container.GetBlockBlobReference("myblob.txt");
await blockBlob.UploadTextAsync("Hello World");
string content = await blockBlob.DownloadTextAsync();

// New SDK - RECOMMENDED
using Azure.Storage.Blobs;
var blobServiceClient = new BlobServiceClient(connectionString);
var containerClient = blobServiceClient.GetBlobContainerClient("mycontainer");
var blobClient = containerClient.GetBlobClient("myblob.txt");
await blobClient.UploadAsync(BinaryData.FromString("Hello World"));
var response = await blobClient.DownloadContentAsync();
string content = response.Value.Content.ToString();

The Microsoft.Azure.Storage.* packages were deprecated in October 2024:
// Deprecated SDK - MIGRATE IMMEDIATELY
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
var storageAccount = CloudStorageAccount.Parse(connectionString);
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("mycontainer");
// Upload with metadata
var blob = container.GetBlockBlobReference("document.pdf");
blob.Metadata["author"] = "John Doe";
blob.Properties.ContentType = "application/pdf";
using (var fileStream = File.OpenRead(@"C:\document.pdf"))
{
await blob.UploadFromStreamAsync(fileStream);
}
await blob.SetMetadataAsync();
await blob.SetPropertiesAsync();

// Modern SDK - USE THIS
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
var blobServiceClient = new BlobServiceClient(connectionString);
var containerClient = blobServiceClient.GetBlobContainerClient("mycontainer");
var blobClient = containerClient.GetBlobClient("document.pdf");
// Upload with metadata and properties in single operation
var uploadOptions = new BlobUploadOptions
{
HttpHeaders = new BlobHttpHeaders
{
ContentType = "application/pdf"
},
Metadata = new Dictionary<string, string>
{
["author"] = "John Doe"
}
};
using (var fileStream = File.OpenRead(@"C:\document.pdf"))
{
await blobClient.UploadAsync(fileStream, uploadOptions);
}

<!-- Remove these deprecated packages -->
<PackageReference Include="WindowsAzure.Storage" Version="*" />
<PackageReference Include="Microsoft.Azure.Storage.Blob" Version="*" />
<PackageReference Include="Microsoft.Azure.Storage.Common" Version="*" />
<!-- Add modern package -->
<PackageReference Include="Azure.Storage.Blobs" Version="12.21.2" />
<PackageReference Include="Azure.Identity" Version="1.12.0" />

// Old pattern
// var account = CloudStorageAccount.Parse(connectionString);
// var client = account.CreateCloudBlobClient();
// New pattern
var blobServiceClient = new BlobServiceClient(connectionString);
// Or with managed identity (recommended)
var blobServiceClientMi = new BlobServiceClient(
new Uri("https://account.blob.core.windows.net"),
new DefaultAzureCredential());

| Legacy Method | Modern Equivalent | Notes |
|---|---|---|
| UploadFromStreamAsync() | UploadAsync(stream) | Simplified API |
| DownloadToStreamAsync() | DownloadToAsync() | Better error handling |
| UploadTextAsync() | UploadAsync(BinaryData.FromString()) | Type-safe approach |
| DownloadTextAsync() | DownloadContentAsync().Value.Content.ToString() | More explicit |
| SetMetadataAsync() | SetMetadataAsync() | Same method name |
| SetPropertiesAsync() | SetHttpHeadersAsync() | More specific naming |
| GetSharedAccessSignature() | GenerateSasUri() | Built-in SAS generation |
| ListBlobsSegmentedAsync() | GetBlobsAsync().AsPages() | Async enumerable |
// Legacy: Connection string only
// var account = CloudStorageAccount.Parse(connectionString);
// Modern: Multiple auth options
// Connection string (dev/test)
var client1 = new BlobServiceClient(connectionString);
// Managed Identity (production - recommended)
var client2 = new BlobServiceClient(accountUri, new DefaultAzureCredential());
// SAS Token
var client3 = new BlobServiceClient(sasUri);
// Account Key (not recommended)
var client4 = new BlobServiceClient(accountUri, new StorageSharedKeyCredential(name, key));

Create a comprehensive test to verify migration:
[Test]
public async Task MigrationValidationTest()
{
var blobServiceClient = new BlobServiceClient(connectionString);
var containerClient = blobServiceClient.GetBlobContainerClient("test-container");
await containerClient.CreateIfNotExistsAsync();
// Test upload
var blobClient = containerClient.GetBlobClient("test-blob.txt");
var content = "Migration test content";
await blobClient.UploadAsync(BinaryData.FromString(content), overwrite: true);
// Test download
var downloadResponse = await blobClient.DownloadContentAsync();
var downloadedContent = downloadResponse.Value.Content.ToString();
Assert.AreEqual(content, downloadedContent);
// Test metadata
var metadata = new Dictionary<string, string> { ["test"] = "value" };
await blobClient.SetMetadataAsync(metadata);
var properties = await blobClient.GetPropertiesAsync();
Assert.AreEqual("value", properties.Value.Metadata["test"]);
// Cleanup
await containerClient.DeleteAsync();
}